
    Standard Model Higgs Searches at the LHC

    The study of the mechanism behind electroweak symmetry breaking is one of the main goals of the Large Hadron Collider and of its general-purpose experiments, ATLAS and CMS. This paper reviews some of the ongoing studies by these collaborations and, when possible, highlights the differences between equivalent channels in both experiments. Comment: Parallel talk at ICHEP08, Philadelphia, USA, July 2008. 4 pages, LaTeX, 8 eps figures

    Commissioning of the ATLAS Level-1 Trigger with Cosmic Rays

    The ATLAS detector at CERN's Large Hadron Collider will be exposed to proton-proton collisions from beams crossing at 40 MHz. A three-level trigger system was designed to select potentially interesting events and reduce the incoming rate to 100-200 Hz. The first trigger level (LVL1) is implemented in custom-built electronics; the second and third trigger levels are realized in software. Based on calorimeter information and hits in dedicated muon-trigger detectors, the LVL1 decision is made by the central-trigger processor, yielding an output rate of less than 100 kHz. The allowed latency for the trigger decision at this stage is less than 2.5 microseconds. Installation of the final LVL1 trigger system at the ATLAS site is in full swing, to be completed later this year. We present a status report of the main components of the first-level trigger and the in-situ commissioning of the full trigger chain with cosmic-ray muons. Comment: On behalf of the ATLAS TDAQ Level-1 Trigger Group. Proceedings for 2007 Europhysics Conference on High Energy Physics, Manchester, July 2007
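
    The quoted rates imply large rejection factors at each selection stage. A minimal back-of-the-envelope sketch, assuming only the numbers given in the abstract (40 MHz bunch crossings, a LVL1 output below 100 kHz, and a final storage rate of 100-200 Hz); variable names are illustrative:

        # Hypothetical back-of-the-envelope check of the rejection factors implied
        # by the rates above; only the rates come from the abstract.
        bunch_crossing_rate_hz = 40e6    # LHC bunch-crossing rate
        lvl1_output_rate_hz    = 100e3   # maximum LVL1 accept rate
        storage_rate_hz        = 200.0   # upper end of the final output rate

        lvl1_rejection  = bunch_crossing_rate_hz / lvl1_output_rate_hz   # ~400
        hlt_rejection   = lvl1_output_rate_hz / storage_rate_hz          # ~500
        total_rejection = bunch_crossing_rate_hz / storage_rate_hz       # ~200,000

        print(f"LVL1 rejection factor : {lvl1_rejection:,.0f}")
        print(f"HLT rejection factor  : {hlt_rejection:,.0f}")
        print(f"Total rejection factor: {total_rejection:,.0f}")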

    The ATLAS trigger: high-level trigger commissioning and operation during early data taking

    The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre-of-mass energy of 14 TeV, with a bunch-crossing rate of 40 MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200 Hz. After the Level 1 trigger, which is implemented in custom hardware, the High-Level Trigger (HLT) further reduces the rate from up to 100 kHz to the offline storage rate while retaining the most interesting physics. The HLT is implemented in software running on commercially available computer farms and consists of Level 2 and the Event Filter. To reduce the network data traffic and the processing time to manageable levels, the HLT uses seeded, step-wise reconstruction, aiming at the earliest possible rejection. Data produced during LHC commissioning will be vital for calibrating and aligning sub-detectors, as well as for testing the ATLAS trigger and setting up the online event selection (or trigger menu). This will be done initially using zero-bias and minimum-bias collisions for system calibration and checks. As the LHC luminosity increases, the trigger menu will need to adapt so that the available output storage rate is used optimally to maximize the ATLAS physics potential. In preparation for LHC collision data, the functionality of the data-acquisition system and the HLT software was demonstrated using part of the readout chain to supply simulated events in raw-data format to a subset of the HLT farm. These tests included a complete set of physics triggers as well as configuration and monitoring. We give an overview of the ATLAS High Level Trigger focusing on the system design and its innovative features. We then present the ATLAS trigger strategy for the initial phase of LHC exploitation, up to a luminosity of 10^31 s^-1 cm^-2. Emphasis will be given to the full trigger menus, including physics and calibration triggers. Finally, we report on the valuable experience acquired through in-situ commissioning of the system, where simulated events were used to exercise the trigger chain. In particular, we show critical quantities such as event processing times, measured in a large-scale HLT farm using a complex trigger menu
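
    The seeded, step-wise reconstruction with early rejection described above can be illustrated with a toy example. The sketch below is hypothetical (the Event type, step names and thresholds are invented, and the real HLT steering is far more elaborate); it only shows the idea of running cheap, seeded steps first and dropping a candidate at its first failing step:

        # Toy illustration of seeded, step-wise selection with early rejection.
        # The Event type, step names and thresholds are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class Event:
            l1_seeds: list                      # regions of interest from Level 1
            data: dict = field(default_factory=dict)

        def fast_calo_step(event, seed):
            # cheap, seeded step: look only at calorimeter data near the L1 seed
            return event.data.get(("calo", seed), 0.0) > 20.0

        def precision_tracking_step(event, seed):
            # more expensive step, run only if the previous step accepted
            return event.data.get(("tracks", seed), 0) >= 1

        STEPS = [fast_calo_step, precision_tracking_step]

        def hlt_decision(event):
            # accept if any seed survives every step; each seed is dropped
            # at its first failing step (earliest possible rejection)
            return any(all(step(event, seed) for step in STEPS)
                       for seed in event.l1_seeds)

        evt = Event(l1_seeds=["roi_0", "roi_1"],
                    data={("calo", "roi_0"): 35.0, ("tracks", "roi_0"): 2,
                          ("calo", "roi_1"): 5.0})
        print(hlt_decision(evt))   # True: roi_0 passes both steps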

    Validation of a questionnaire algorithm based on repeated open application testing with the constituents of fragrance mix II: the EDEN Fragrance Study

    Background: In a European study on contact allergy in the general population, it has been hypothesized that the combination of contact allergy to a fragrance together with a history indicating dermatitis at exposure, and subsequent avoidance of scented products, implied a diagnosis of allergic contact dermatitis. Objectives: The primary aim of this study was to validate this hypothesis/algorithm. The secondary aim was to investigate whether there was any association between the outcome of the repeated open application test (ROAT) and the patch test reactivity. Methods: One hundred nine subjects with and without contact allergy to fragrance mix II (FM II) were recruited. Volunteers from six European dermatology clinics participated in the study, which included a patch test and a ROAT. Results: Twenty-four positive ROAT reactions were noted in total, including 20 of the 32 volunteers with contact allergy to FM II. None of the volunteers reacted to the vehicle (P < 0.001). More individuals with a positive algorithm had positive ROATs than those with a negative algorithm; however, the difference was not statistically significant (P = 0.12). The lower the patch test concentration eliciting a positive test reaction, the more likely a positive ROAT was, and the more likely it was that the positive ROAT appeared early during the investigative period. Conclusions: The algorithm used in this study was not validated, but it was indicated in this ROAT set-up. The stronger the patch test reactivity, the more likely a positive ROAT was, and the more likely it was that the positive ROAT appeared early during the application period

    Validation of a questionnaire algorithm based on repeated open application testing with the constituents of fragrance mix I

    Background: In a European study on contact allergy in the general population, it was hypothesized that the combination of contact allergy to a fragrance together with a history indicating dermatitis at exposure, and subsequent avoidance of scented products, implied a diagnosis of allergic contact dermatitis. Objectives: The primary aim of this study was to validate this hypothesis and algorithm. The secondary aim was to investigate whether there was any association between the outcome of the repeated open application test (ROAT) and the patch test reactivity. Methods: In total, 109 patients with and without contact allergy to fragrance mix (FM) I were recruited. Volunteers from six European dermatology clinics participated in the study, which included a patch test and a ROAT. Results: Positive ROAT reactions were noted in 26 of the 44 volunteers with contact allergy to FM I. None of the volunteers reacted to the vehicle (P < 0.001). More individuals with a positive algorithm had positive ROATs than those with a negative algorithm; however, the difference was not statistically significant. The lower the patch test concentration eliciting a positive test reaction, the more likely a positive ROAT was, and the more likely it was that the positive ROAT appeared early during the investigative period. Conclusions: The algorithm used in this study was not substantiated in this ROAT set-up. The stronger the patch test reactivity, the more likely a positive ROAT was, and the more likely it was that the positive ROAT appeared early during the application period. What's already known about this topic? To the best of our knowledge, a scientifically designed and conducted repeated open application test (ROAT) has never before been performed to validate a diagnosis of allergic contact dermatitis based partly on a questionnaire. What does this study add? This is the largest controlled, randomized and blinded ROAT performed to date. Higher patch test reactivity to fragrance mix I indicated a greater likelihood of a positive ROAT

    Chelator free gallium-68 radiolabelling of silica coated iron oxide nanorods via surface interactions

    The commercial availability of combined magnetic resonance imaging (MRI)/positron emission tomography (PET) scanners for clinical use has increased demand for easily prepared agents which offer signal or contrast in both modalities. Herein we describe a new class of silica-coated iron oxide nanorods (NRs) further coated with polyethylene glycol (PEG) and/or a tetraazamacrocyclic chelator (DO3A). Studies of the coated NRs validate their composition and confirm their properties as in vivo T₂ MRI contrast agents. Radiolabelling studies with the positron-emitting radioisotope gallium-68 (t½ = 68 min) demonstrate that, in the presence of the silica coating, the macrocyclic chelator was not required for the preparation of highly stable radiometal-NR constructs. In vivo PET-CT and MR imaging studies show the expected high liver uptake of gallium-68-radiolabelled nanorods with no significant release of gallium-68 metal ions, validating this simple, novel method for labelling iron oxide NRs with a radiometal in the absence of a chelating unit and its use for high-sensitivity liver imaging

    Sequencing of 53,831 diverse genomes from the NHLBI TOPMed Program

    The Trans-Omics for Precision Medicine (TOPMed) programme seeks to elucidate the genetic architecture and biology of heart, lung, blood and sleep disorders, with the ultimate goal of improving diagnosis, treatment and prevention of these diseases. The initial phases of the programme focused on whole-genome sequencing of individuals with rich phenotypic data and diverse backgrounds. Here we describe the TOPMed goals and design as well as the available resources and early insights obtained from the sequence data. The resources include a variant browser, a genotype imputation server, and genomic and phenotypic data that are available through dbGaP (Database of Genotypes and Phenotypes)(1). In the first 53,831 TOPMed samples, we detected more than 400 million single-nucleotide and insertion or deletion variants after alignment with the reference genome. Additional previously undescribed variants were detected through assembly of unmapped reads and customized analysis in highly variable loci. Among the more than 400 million detected variants, 97% have frequencies of less than 1% and 46% are singletons that are present in only one individual (53% among unrelated individuals). These rare variants provide insights into mutational processes and recent human evolutionary history. The extensive catalogue of genetic variation in TOPMed studies provides unique opportunities for exploring the contributions of rare and noncoding sequence variants to phenotypic variation. Furthermore, combining TOPMed haplotypes with modern imputation methods improves the power and reach of genome-wide association studies to include variants down to a frequency of approximately 0.01%
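
    The frequency classes quoted above (97% of variants below 1% frequency, 46% singletons) follow directly from allele counts. A minimal sketch of the classification, assuming diploid samples; the allele counts used below are invented purely for illustration:

        # Illustrative tally of the variant classes mentioned above. Only the
        # cut-offs (frequency < 1% for "rare", allele count == 1 for "singleton")
        # follow the text; the allele counts below are invented.
        def classify(allele_count, n_samples):
            freq = allele_count / (2 * n_samples)   # diploid samples
            if allele_count == 1:
                return "singleton"
            return "rare" if freq < 0.01 else "common"

        n_samples = 53831
        for ac in (1, 2, 40, 5000, 60000):          # hypothetical allele counts
            print(ac, classify(ac, n_samples))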

    MaCH: Using sequence and genotype data to estimate haplotypes and unobserved genotypes

    Genome‐wide association studies (GWAS) can identify common alleles that contribute to complex disease susceptibility. Despite the large number of SNPs assessed in each study, the effects of most common SNPs must be evaluated indirectly using either genotyped markers or haplotypes thereof as proxies. We have previously implemented a computationally efficient Markov Chain framework for genotype imputation and haplotyping in the freely available MaCH software package. The approach describes sampled chromosomes as mosaics of each other and uses available genotype and shotgun sequence data to estimate unobserved genotypes and haplotypes, together with useful measures of the quality of these estimates. Our approach is already widely used to facilitate comparison of results across studies as well as meta‐analyses of GWAS. Here, we use simulations and experimental genotypes to evaluate its accuracy and utility, considering choices of genotyping panels, reference panel configurations, and designs where genotyping is replaced with shotgun sequencing. Importantly, we show that genotype imputation not only facilitates cross study analyses but also increases power of genetic association studies. We show that genotype imputation of common variants using HapMap haplotypes as a reference is very accurate using either genome‐wide SNP data or smaller amounts of data typical in fine‐mapping studies. Furthermore, we show the approach is applicable in a variety of populations. Finally, we illustrate how association analyses of unobserved variants will benefit from ongoing advances such as larger HapMap reference panels and whole genome shotgun sequencing technologies
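
    MaCH describes sampled chromosomes as mosaics of reference haplotypes using a Markov chain. The toy sketch below is not the MaCH algorithm; it replaces the Markov chain with a crude best-matching pair of reference haplotypes, purely to convey how untyped alleles can be copied from a reference panel. The haplotypes, typed sites and genotypes are invented:

        # Toy stand-in for reference-based genotype imputation. MaCH models each
        # sample as a mosaic of reference haplotypes via a Markov chain; this
        # sketch instead copies untyped alleles from the single best-matching
        # pair of reference haplotypes. All data below are invented.
        import itertools

        reference = [                    # reference haplotypes over 6 markers
            [0, 0, 1, 0, 1, 1],
            [1, 1, 0, 1, 0, 0],
            [0, 1, 1, 0, 1, 0],
            [1, 0, 0, 1, 0, 1],
        ]
        typed_sites = [0, 2, 4]          # markers genotyped in the sample
        genotypes = {0: 1, 2: 1, 4: 2}   # observed allele dosages at typed sites

        def mismatch(h1, h2):
            # how badly a haplotype pair disagrees with the observed dosages
            return sum(abs(h1[s] + h2[s] - genotypes[s]) for s in typed_sites)

        best = min(itertools.combinations_with_replacement(reference, 2),
                   key=lambda pair: mismatch(*pair))
        imputed = {s: best[0][s] + best[1][s]
                   for s in range(len(reference[0])) if s not in typed_sites}
        print("imputed dosages at untyped markers:", imputed)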

    Deep-coverage whole genome sequences and blood lipids among 16,324 individuals.

    Large-scale deep-coverage whole-genome sequencing (WGS) is now feasible and offers potential advantages for locus discovery. We perform WGS in 16,324 participants from four ancestries at mean depth >29X and analyze genotypes with four quantitative traits: plasma total cholesterol, low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol, and triglycerides. Common-variant association yields known loci, except for a few variants that were previously poorly imputed. Rare coding-variant association yields known Mendelian dyslipidemia genes, but rare non-coding-variant association detects no signals. A high 2M-SNP LDL-C polygenic score (top 5th percentile) confers an effect size similar to that of a monogenic mutation (~30 mg/dl higher for each); however, among those with severe hypercholesterolemia, 23% have a high polygenic score and only 2% carry a monogenic mutation. At these sample sizes and for these phenotypes, the incremental value of WGS for discovery is limited, but WGS permits simultaneous assessment of monogenic and polygenic contributions to severe hypercholesterolemia
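
    The polygenic-score comparison described above rests on a weighted sum of allele dosages and a percentile cut-off. A hypothetical sketch with invented SNP weights, dosages and cohort size (the real score uses roughly 2 million SNPs and measured LDL-C levels):

        # Hypothetical polygenic-score sketch: a weighted sum of allele dosages,
        # with the top 5th percentile flagged as "high". Weights, dosages and
        # cohort size are invented; the real score uses ~2M SNPs and LDL-C.
        import numpy as np

        rng = np.random.default_rng(0)
        n_people, n_snps = 10000, 2000
        weights = rng.normal(0.0, 0.02, n_snps)             # per-allele effects
        dosages = rng.integers(0, 3, (n_people, n_snps))    # 0/1/2 allele counts

        prs = dosages @ weights                              # score per person
        high_prs = prs >= np.percentile(prs, 95)             # top 5th percentile
        print("people with a high polygenic score:", int(high_prs.sum()))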

    Triggering events with GPUs at ATLAS

    The growing complexity of events produced in LHC collisions demands increasing computing power, both for the online selection and for the offline reconstruction of events. In recent years there have been significant advances in the performance of Graphics Processing Units (GPUs), both in terms of increased compute power and reduced power consumption, that make GPUs extremely attractive for use in complex particle physics experiments such as ATLAS. A small-scale prototype of the full ATLAS High Level Trigger has been implemented that exploits reconstruction algorithms optimized for this new massively parallel paradigm. We discuss the integration procedure followed for this prototype and present the performance achieved and the prospects for the future. Peer Reviewed